
Scaled stochastic gradient descent for low-rank matrix completion



Abstract

The paper looks at a scaled variant of the stochastic gradient descent algorithm for the matrix completion problem. Specifically, we propose a novel matrix-scaling of the partial derivatives that acts as an efficient preconditioning for the standard stochastic gradient descent algorithm. This proposed matrix-scaling provides a trade-off between local and global second order information. It also resolves the issue of scale invariance that exists in matrix factorization models. The overall computational complexity is linear with the number of known entries, thereby extending to a large-scale setup. Numerical comparisons show that the proposed algorithm competes favorably with state-of-the-art algorithms on various different benchmarks.
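The scaled update the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact method: it assumes a factorization X ≈ UVᵀ and uses the regularized Gram matrices (VᵀV + λI)⁻¹ and (UᵀU + λI)⁻¹ as the matrix-scaling of the per-entry partial gradients; the function name and all parameters are hypothetical. Refreshing the preconditioners once per epoch keeps the per-entry cost at O(r²), so the total work stays linear in the number of known entries, matching the complexity claim in the abstract.

```python
import numpy as np

def scaled_sgd(shape, entries, rank, lr=0.05, lam=1e-6, epochs=200, seed=0):
    """Hypothetical sketch of scaled SGD for low-rank matrix completion.

    entries: list of (i, j, value) triples for the observed positions.
    The Gram-matrix preconditioners below are one common choice of
    matrix-scaling; the paper's formulation may differ in detail.
    """
    m, n = shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, rank)) / np.sqrt(rank)
    V = rng.standard_normal((n, rank)) / np.sqrt(rank)
    entries = list(entries)
    for _ in range(epochs):
        # Refresh the r x r preconditioners once per epoch so the
        # per-entry update stays O(r^2), keeping total cost linear in |Omega|.
        PU = np.linalg.inv(V.T @ V + lam * np.eye(rank))
        PV = np.linalg.inv(U.T @ U + lam * np.eye(rank))
        rng.shuffle(entries)
        for i, j, x in entries:
            r = U[i] @ V[j] - x              # residual on the sampled entry
            # Scaled (preconditioned) stochastic gradient steps on each factor.
            U[i] -= lr * r * (V[j] @ PU)
            V[j] -= lr * r * (U[i] @ PV)
    return U, V
```

The scaling also addresses the scale-invariance issue mentioned above: replacing (U, V) by (Uα, Vα⁻¹) changes the plain gradients but leaves the preconditioned search direction essentially balanced between the two factors.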


